Introduction to entropy

The idea of "irreversibility" is central to the understanding of entropy. Everyone has an intuitive understanding of irreversibility (a dissipative process): if one watches a movie of everyday life running forward and in reverse, it is easy to distinguish between the two. The movie running in reverse shows impossible things happening: water jumping out of a glass into a pitcher above it, smoke going down a chimney, water in a glass freezing to form ice cubes, crashed cars reassembling themselves, and so on. The intuitive meaning of expressions such as "you can't unscramble an egg", "don't cry over spilled milk" or "you can't take the cream out of the coffee" is that these describe irreversible processes. Time has a direction: spilled milk does not go back into the glass.
In thermodynamics, one says that the "forward" processes – pouring water from a pitcher, smoke going up a chimney, etc. – are "irreversible": they cannot happen in reverse, even though, on a microscopic level, no laws of physics would be violated if they did. All real physical processes involving systems in everyday life, with many atoms or molecules, are irreversible. For an irreversible process in an isolated system, the thermodynamic state variable known as entropy always increases. The reason that the movie in reverse is so easily recognized is that it shows processes for which entropy is decreasing, which is physically impossible. In everyday life there may be processes whose increase of entropy is practically unobservable, almost zero. In these cases, a movie of the process run in reverse will not look implausible. For example, in a one-second video of the collision of two billiard balls, it is hard to tell the forward from the reversed version, because the increase of entropy during that time is relatively small. In thermodynamics, one says that such a process is practically "reversible", with an entropy increase that is practically zero. The fact that the entropy of the universe never decreases is expressed in the second law of thermodynamics.
In a physical system, entropy provides a measure of the amount of thermal energy that ''cannot'' be used to do work. In some other definitions of entropy, it is a measure of how evenly energy (or some analogous property) is distributed in a system. ''Work'' and ''heat'' are determined by a process that a system undergoes, and only occur at the boundary of a system. ''Entropy'' is a function of the state of a system, and has a value determined by the state variables of the system.
The concept of entropy is central to the second law of thermodynamics. The second law determines which physical processes can occur. For example, it predicts that the flow of heat from a region of high temperature to a region of low temperature is a spontaneous process – it can proceed by itself without needing any extra external energy. When this process occurs, the hot region becomes cooler and the cold region becomes warmer. Heat is distributed more evenly throughout the system, and the system's ability to do work has decreased, because the temperature difference between the hot region and the cold region has shrunk. Referring back to our definition of entropy, we can see that the entropy of this system has increased. Thus, the second law of thermodynamics can be stated as follows: the entropy of an isolated system always increases, and processes which increase entropy can occur spontaneously. The entropy of a system increases as the range of momentum and/or position available to its components increases.
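The spontaneous heat flow described above can be checked numerically: treating the hot and cold regions as reservoirs at fixed temperatures, the total entropy change is Q/T_cold − Q/T_hot, which is positive whenever T_hot > T_cold. A minimal sketch (the heat quantity and temperatures are arbitrary assumptions for illustration, not values from the text):

```python
# Entropy change when heat Q flows from a hot reservoir to a cold one.
# All numerical values are illustrative assumptions.
Q = 100.0       # joules transferred
T_hot = 400.0   # kelvin
T_cold = 300.0  # kelvin

dS_hot = -Q / T_hot    # hot reservoir loses entropy
dS_cold = Q / T_cold   # cold reservoir gains more than the hot one loses
dS_total = dS_hot + dS_cold

print(round(dS_total, 4))  # 0.0833 J/K > 0: the process is spontaneous
```

Running the same calculation with T_hot = T_cold gives a total change of zero, the reversible limit mentioned above.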
The term ''entropy'' was coined in 1865 by the German physicist Rudolf Clausius, from the Greek words ''en-'', "in", and ''trope'' "a turning", in analogy with ''energy''.
==Explanation==
The concept of thermodynamic entropy arises from the second law of thermodynamics. By quantifying the reduction in a system's capacity for change, entropy determines whether a thermodynamic process may occur; for example, heat always flows from a region of higher temperature to one of lower temperature until the temperature becomes uniform.
Entropy is calculated in two ways. The first is the entropy change (ΔS) of a system containing a sub-system which undergoes heat transfer to its surroundings (inside the system of interest). It is based on the macroscopic relationship between the heat flow into the sub-system and the temperature at which it occurs, summed over the boundary of that sub-system. The second calculates the absolute entropy (S) of a system from the microscopic behaviour of its individual particles. It is based on the natural logarithm of the number of microstates possible in a particular macrostate (W or Ω), called the thermodynamic probability, which is roughly the probability of the system being in that state. In this sense it defines entropy independently of its effects due to changes that may involve heat, mechanical, electrical or chemical energy, and it also covers logical states such as information.
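The second, microscopic calculation is Boltzmann's S = k ln W. A minimal sketch of it (the microstate counts are arbitrary assumptions chosen only to show the logarithmic behaviour):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)

def boltzmann_entropy(W):
    """Absolute entropy S = k_B * ln(W) for W accessible microstates."""
    return k_B * math.log(W)

# Doubling the number of accessible microstates adds k_B * ln(2) of entropy,
# regardless of the starting count -- a consequence of ln(2W) = ln(W) + ln(2).
delta = boltzmann_entropy(2e20) - boltzmann_entropy(1e20)
```

Because the logarithm turns products into sums, the entropies of independent sub-systems add, which is what makes S a useful extensive state variable.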
Following the formalism of Clausius, the first calculation can be mathematically stated as:〔I. Klotz, R. Rosenberg, ''Chemical Thermodynamics - Basic Concepts and Methods'', 7th ed., Wiley (2008), p. 125〕
: \Delta S = \frac{q_{\mathrm{rev}}}{T}
where ''q''<sub>rev</sub> is the heat transferred reversibly to the system and ''T'' is the absolute temperature at which the transfer occurs.
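As a worked numerical example of the Clausius relation ΔS = q_rev/T (the figures below, roughly those for melting one mole of ice, are assumptions for illustration):

```python
# Clausius entropy change for a reversible, isothermal process: dS = q_rev / T.
# Values are assumptions, roughly those for melting one mole of ice.
q_rev = 6007.0   # heat absorbed reversibly, in J/mol
T = 273.15       # melting temperature of ice, in K

delta_S = q_rev / T
print(round(delta_S, 2))  # 21.99 J/(mol K)
```

Because the melting is isothermal, T can be pulled out of the sum over the boundary and the calculation reduces to a single division.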
The concept of energy is related to the first law of thermodynamics, which deals with the conservation of energy: a loss of heat results in a decrease in the internal energy of the thermodynamic system. Thermodynamic entropy provides a comparative measure of the amount of this decrease in internal energy of the system and the corresponding increase in internal energy of the surroundings at a given temperature. A simple and more concrete visualization of the second law is that energy of all types changes from being localized to becoming dispersed or spread out, if it is not hindered from doing so. Entropy change is the quantitative measure of that kind of spontaneous process: how much energy has flowed, or how widely it has become spread out, at a specific temperature.
The concept of entropy has been developed to describe any of several phenomena, depending on the field and the context in which it is being used. Information entropy takes the mathematical concepts of statistical thermodynamics into areas of probability theory unconnected with heat and energy.
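As a small illustration of the information-theoretic analogue, Shannon entropy measures the uncertainty of a probability distribution in bits. A sketch (the coin distributions are arbitrary assumptions):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair = shannon_entropy([0.5, 0.5])    # a fair coin: exactly 1 bit
biased = shannon_entropy([0.9, 0.1])  # a biased coin: less uncertainty
```

The formula has the same logarithm-of-probabilities shape as the Boltzmann expression, which is what lets the statistical-thermodynamic machinery transfer to probability theory.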

Source: the free encyclopedia Wikipedia (English edition), article "Introduction to entropy".



